Creators/Authors contains: "Jurić, Mario"

  1. Abstract The boundary of solar system object discovery lies in detecting its faintest members. However, their discovery in detection catalogs from imaging surveys is fundamentally limited by the practice of thresholding detections at signal-to-noise ratio (SNR) ≥ 5 to maintain catalog purity. Faint moving objects can be recovered from survey images using the shift-and-stack algorithm, which coadds pixels from multi-epoch images along a candidate trajectory. Trajectories matching real objects accumulate signal coherently, enabling high-confidence detections of very faint moving objects. Applying shift-and-stack comes with a high computational cost, which scales with target object velocity, typically limiting its use to searches for slow-moving objects in the outer solar system. This work introduces a modified shift-and-stack algorithm that trades sensitivity for speed. Our algorithm stacks low-SNR detection catalogs instead of pixels; their sparsity enables approximations that reduce the number of stacks required. Our algorithm achieves real-world speedups of 10–10³× over image-based shift-and-stack while retaining the ability to find faint objects. We validate its performance by recovering synthetic inner and outer solar system objects injected into images from the DECam Ecliptic Exploration Project. Exploring the sensitivity–compute time trade-off of this algorithm, we find that our method achieves a speedup of ∼30× with 88% of the memory usage while sacrificing 0.25 mag in depth compared to image-based shift-and-stack. These speedups enable the broad application of shift-and-stack to large-scale imaging surveys and searches for faint inner solar system objects. We provide a reference implementation via the find-asteroids Python package at https://github.com/stevenstetzler/find-asteroids.
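To make the catalog-based approach concrete, here is a minimal sketch of the core idea described in this abstract: low-SNR detections from multiple epochs are shifted back along a trial velocity and counted where they pile up, so a trajectory matching a real object collects many detections. The function and parameter names below are illustrative and are not taken from the find-asteroids package.

```python
import numpy as np

def shift_and_stack_catalog(detections, times, vx, vy, match_radius=1.0):
    """Shift low-SNR detections back along a trial velocity (vx, vy) and
    count how many land close together -- a crude stand-in for the
    catalog-based shift-and-stack idea described in the abstract.

    detections : array of shape (N, 2) with (x, y) positions in arcsec
    times      : array of shape (N,) with epochs in hours (same order)
    vx, vy     : trial velocity in arcsec / hour
    """
    # Undo the assumed motion so a real object's detections pile up
    shifted = detections - np.outer(times, [vx, vy])

    # Grid the shifted positions and find the most occupied cell;
    # a real implementation would cluster properly and weight by SNR.
    grid = np.round(shifted / match_radius).astype(int)
    cells, counts = np.unique(grid, axis=0, return_counts=True)
    best = counts.argmax()
    return cells[best] * match_radius, counts[best]

# Toy usage: 5 epochs of a slow mover plus background noise detections
rng = np.random.default_rng(0)
t = np.arange(5.0)                                          # hours
obj = np.array([10.0, 20.0]) + np.outer(t, [2.0, -1.0])     # 2, -1 arcsec/hr
noise = rng.uniform(0, 100, size=(50, 2))
dets = np.vstack([obj, noise])
times = np.concatenate([t, rng.uniform(0, 4, size=50)])

pos, n_hits = shift_and_stack_catalog(dets, times, vx=2.0, vy=-1.0)
print(pos, n_hits)   # the correct trial velocity should stack all 5 object detections
```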
  2. Abstract We present a scalable, cloud-based science platform solution designed to enable next-to-the-data analyses of terabyte-scale astronomical tabular data sets. The presented platform is built on Amazon Web Services (over Kubernetes and S3 abstraction layers), utilizes Apache Spark and the Astronomy eXtensions for Spark for parallel data analysis and manipulation, and provides the familiar JupyterHub web-accessible front end for user access. We outline the architecture of the analysis platform, provide implementation details and the rationale for (and against) technology choices, verify scalability through strong and weak scaling tests, and demonstrate usability through an example science analysis of data from the Zwicky Transient Facility’s 1Bn+ light-curve catalog. Furthermore, we show how this system enables an end user to iteratively build analyses (in Python) that transparently scale processing with no need for end-user interaction. The system is designed to be deployable by astronomers with moderate cloud engineering knowledge, or (ideally) IT groups. Over the past 3 yr, it has been utilized to build science platforms for the DiRAC Institute, the ZTF partnership, the LSST Solar System Science Collaboration, and the LSST Interdisciplinary Network for Collaboration and Computing, as well as for numerous short-term events (with over 100 simultaneous users). A live demo instance, together with the deployment scripts, source code, and cost calculators, is accessible at http://hub.astronomycommons.org/.
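As an illustration of the "transparently scaling" analysis style described above, the sketch below shows what a per-object light-curve statistic might look like in PySpark from a JupyterHub session attached to such a platform. The catalog path and column names are hypothetical, and the AXS-specific cross-matching API is not shown.

```python
# Minimal PySpark sketch of a catalog-scale light-curve analysis of the
# kind the platform is built for. Table location and columns are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ztf-lc-stats").getOrCreate()

# Hypothetical Parquet copy of a ZTF light-curve catalog on S3
lc = spark.read.parquet("s3a://example-bucket/ztf/light_curves/")

# Per-object variability summary; Spark parallelizes this across the cluster
stats = (
    lc.groupBy("object_id", "filter")
      .agg(
          F.count("mag").alias("n_obs"),
          F.mean("mag").alias("mean_mag"),
          F.stddev("mag").alias("rms_mag"),
      )
      .where(F.col("n_obs") >= 20)          # keep well-sampled light curves
)

# Bring only the small aggregated result back to the notebook
stats.orderBy(F.desc("rms_mag")).limit(10).show()
```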
  3. Abstract New mass estimates and cumulative mass profiles with Bayesian credible regions for the Milky Way (MW) are found using the Galactic Mass Estimator (GME) code and dwarf galaxy (DG) kinematic data from multiple sources. GME takes a hierarchical Bayesian approach to simultaneously estimate the true positions and velocities of the DGs, their velocity anisotropy, and the model parameters for the Galaxy’s total gravitational potential. In this study, we incorporate meaningful prior information from past studies and simulations. The prior distributions for the physical model are informed by the results of Eadie & Jurić, who used globular clusters instead of DGs, as well as by the subhalo distributions of the Ananke Gaia-like surveys from the Feedback in Realistic Environments-2 cosmological simulations (see Sanderson et al.). Using DGs beyond 45 kpc, we report median and 95% credible region estimates of r200 = 212.8 (191.12, 238.44) kpc and a total enclosed mass M200 = 1.19 (0.87, 1.68) × 10¹² M⊙ (adopting Δc = 200). Median mass estimates at specific radii are also reported (e.g., M(< 50 kpc) = 0.52 × 10¹² M⊙ and M(< 100 kpc) = 0.78 × 10¹² M⊙). The estimates are comparable to other recent studies using Gaia DR2 and DGs, but notably different from those of Eadie & Jurić. We perform a sensitivity analysis to investigate whether individual DGs and/or a more massive Large Magellanic Cloud (LMC), on the order of 10¹¹ M⊙, may be affecting our mass estimates. We find possible supporting evidence for the idea that some DGs are affected by a massive LMC and are not in equilibrium with the MW.
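For readers unfamiliar with the Δc = 200 convention quoted above, M200 and r200 are tied together by requiring the mean density inside r200 to be 200 times the critical density of the Universe. A quick self-consistency check of the quoted numbers is sketched below; the value of H0 is an illustrative choice, not one taken from the paper.

```python
import numpy as np

# Consistency check of the M200 / r200 convention quoted in the abstract:
#   M200 = (4/3) * pi * r200**3 * 200 * rho_crit,  rho_crit = 3 H0^2 / (8 pi G)
# H0 below is an illustrative choice, not a value taken from the paper.
G = 4.30091e-6          # kpc (km/s)^2 / Msun
H0 = 70.0 / 1000.0      # km/s per kpc  (i.e. 70 km/s/Mpc)

rho_crit = 3.0 * H0**2 / (8.0 * np.pi * G)           # Msun / kpc^3
r200 = 212.8                                          # kpc, median from the abstract
M200 = (4.0 / 3.0) * np.pi * r200**3 * 200.0 * rho_crit

print(f"M200 ~ {M200:.2e} Msun")   # ~1.1e12 Msun, consistent with the quoted
                                   # median to within the choice of H0
```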
  4. Abstract The Vera C. Rubin Observatory is expected to start the Legacy Survey of Space and Time (LSST) in early to mid-2025. This multiband, wide-field synoptic survey will transform our view of the solar system, with the discovery and monitoring of over five million small bodies. The final survey strategy chosen for LSST has direct implications for the discoverability and characterization of solar system minor planets and passing interstellar objects. Creating an inventory of the solar system is one of the four main LSST science drivers. The LSST observing cadence is a complex optimization problem that must balance the priorities and needs of all the key LSST science areas. To design the best LSST survey strategy, a series of operations simulations using the Rubin Observatory scheduler have been generated to explore the various options for tuning observing parameters and prioritizations. We explore the impact of the various simulated LSST observing strategies on studies of the solar system’s small-body reservoirs, examine the best observing scenarios, and review the important considerations for maximizing LSST solar system science. In general, most of the LSST cadence simulations produce variations of ±5% or less in our chosen key metrics, but a subset of the simulations significantly hinder science returns, with much larger losses in the discovery and light-curve metrics.
  5. Abstract We present a detailed study of the observational biases of the DECam Ecliptic Exploration Project’s B1 data release and survey simulation software that enables direct statistical comparisons between models and our data. We inject a synthetic population of objects into the images and then recover them in the same processing as our real detections. This enables us to characterize the survey’s completeness as a function of apparent magnitude and on-sky rate of motion. We study the statistically optimal functional form for the magnitude efficiency and develop a methodology that can estimate the magnitude and rate efficiencies for all of the survey’s pointing groups simultaneously. We have determined that our peak completeness is on average 80% in each pointing group, and that the completeness drops to 25% of this value at m25 = 26.22. We describe the freely available survey simulation software and its methodology. We conclude by using it to infer that our effective search area for objects at 40 au is 14.8 deg², and that our lack of dynamically cold distant objects means that there are at most 8 × 10³ objects with 60 < a < 80 au and absolute magnitudes H ≤ 8.
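To illustrate what quantities like the peak completeness and m25 measure, the sketch below uses a generic logistic roll-off of detection efficiency with magnitude. This is not the functional form fitted in the paper; it is only a toy curve showing how m25 follows from a peak completeness and a roll-off.

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative-only efficiency curve: a logistic drop-off of detection
# completeness with magnitude. NOT the functional form fit in the paper;
# it just shows what "peak completeness" and "m25" quantify.
def completeness(m, eps0=0.80, m_half=26.0, width=0.3):
    """eps0   : peak completeness at bright magnitudes
       m_half : magnitude where completeness falls to half of eps0
       width  : softness of the roll-off"""
    return eps0 / (1.0 + np.exp((m - m_half) / width))

# m25 is the magnitude where completeness reaches 25% of the peak value
m25 = brentq(lambda m: completeness(m) - 0.25 * 0.80, 20.0, 30.0)
print(f"m25 = {m25:.2f}")   # ~26.33 for these toy parameters
```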
  6. Abstract We present the first set of trans-Neptunian objects (TNOs) observed on multiple nights in data taken from the DECam Ecliptic Exploration Project. Of these 110 TNOs, 105 do not coincide with previously known TNOs and appear to be new discoveries. Each individual detection for our objects resulted from a digital tracking search at TNO rates of motion, using two-to-four-hour exposure sets, and the detections were subsequently linked across multiple observing seasons. This procedure allows us to find objects with magnitudes mVR ≈ 26. The object discovery processing also included a comprehensive population of objects injected into the images, with a recovery and linking rate of at least 94%. The final orbits were obtained using a specialized orbit-fitting procedure that accounts for the positional errors derived from the digital tracking procedure. Our results include robust orbits and magnitudes for classical TNOs with absolute magnitudes H ∼ 10, as well as a dynamically detached object found at 76 au (semimajor axis a ≈ 77 au). We find a disagreement between our population of classical TNOs and the CFEPS-L7 three-component model for the Kuiper Belt.
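A rough back-of-the-envelope check on the quoted numbers: for a TNO observed near opposition, the absolute magnitude follows from the apparent magnitude and the heliocentric and geocentric distances, with phase effects neglected. The distances below are illustrative, not values from the paper.

```python
import numpy as np

# Relation between apparent and absolute magnitude for a distant object
# (phase-angle term neglected, a good approximation for TNOs):
#   H = m - 5 * log10(r * Delta),  with r and Delta in au.
# The distances below are illustrative only.
def absolute_magnitude(m, r_au, delta_au):
    return m - 5.0 * np.log10(r_au * delta_au)

m_vr = 26.0               # faint end of the reported detections
r, delta = 40.0, 39.0     # a typical classical-belt geometry near opposition
print(f"H ~ {absolute_magnitude(m_vr, r, delta):.1f}")  # ~10, matching the abstract
```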
  7. ABSTRACT This paper presents a new optical imaging survey of four deep drilling fields (DDFs), two Galactic and two extragalactic, with the Dark Energy Camera (DECam) on the 4-m Blanco telescope at the Cerro Tololo Inter-American Observatory (CTIO). During the first year of observations in 2021, >4000 images covering 21 deg² (seven DECam pointings), with ∼40 epochs (nights) per field and 5 to 6 images per night per filter in g, r, i, and/or z, have become publicly available (the proprietary period for this program is waived). We describe the real-time difference-image pipeline and how alerts are distributed to brokers via the same distribution system as the Zwicky Transient Facility (ZTF). In this paper, we focus on the two extragalactic deep fields (COSMOS and ELAIS-S1), characterizing the detected sources and demonstrating that the survey design is effective for probing the discovery space of faint and fast variable and transient sources. We describe and make publicly available 4413 calibrated light curves based on difference-image detection photometry of transients and variables in the extragalactic fields. We also present preliminary scientific analysis regarding solar system small bodies, stellar flares and variables, Galactic anomaly detection, fast-rising transients and variables, supernovae, and active galactic nuclei.
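To make the difference-imaging step concrete, the sketch below shows the bare-bones idea behind difference-image detection: subtract a co-registered reference (template) image from each new science image and keep peaks above a noise threshold. A production pipeline such as the one described above additionally matches PSFs and photometric scales between the images, which is not shown here.

```python
import numpy as np

def naive_difference_detections(science, template, sky_sigma, nsigma=5.0):
    """Bare-bones difference-image detection: subtract a co-registered
    template from the science image and flag pixels above nsigma times
    the combined sky noise. A production pipeline also matches PSFs and
    flux scales between the two images before subtracting.
    """
    diff = science - template
    noise = np.sqrt(2.0) * sky_sigma           # noise adds from both images
    ys, xs = np.where(diff > nsigma * noise)
    return diff, list(zip(xs.tolist(), ys.tolist()))

# Toy usage: a constant background with one injected transient source
rng = np.random.default_rng(1)
sky_sigma = 1.0
template = rng.normal(100.0, sky_sigma, size=(64, 64))
science = rng.normal(100.0, sky_sigma, size=(64, 64))
science[30, 40] += 20.0                        # inject a bright transient

_, detections = naive_difference_detections(science, template, sky_sigma)
print(detections)   # should report a detection at (x=40, y=30)
```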
  8. Abstract Developing sustainable software for the scientific community requires expertise in software engineering and domain science. This can be challenging due to the unique needs of scientific software, the insufficient resources for software engineering practices in the scientific community, and the complexity of developing for evolving scientific contexts. While open‐source software can partially address these concerns, it can introduce complicating dependencies and delay development. These issues can be reduced if scientists and software developers collaborate. We present a case study wherein scientists from the SuperNova Early Warning System collaborated with software developers from the Scalable Cyberinfrastructure for Multi‐Messenger Astrophysics project. The collaboration addressed the difficulties of open‐source software development, but presented additional risks to each team. For the scientists, there was a concern of relying on external systems and lacking control in the development process. For the developers, there was a risk in supporting a user‐group while maintaining core development. These issues were mitigated by creating a second Agile Scrum framework in parallel with the developers' ongoing Agile Scrum process. This Agile collaboration promoted communication, ensured that the scientists had an active role in development, and allowed the developers to evaluate and implement the scientists' software requirements. The collaboration provided benefits for each group: the scientists actuated their development by using an existing platform, and the developers utilized the scientists' use‐case to improve their systems. This case study suggests that scientists and software developers can avoid scientific computing issues by collaborating and that Agile Scrum methods can address emergent concerns. 